
    Automated Analysis of Synchronization in Human Full-body Expressive Movement

    The research presented in this thesis is focused on the creation of computational models for the study of human full-body movement, in order to investigate human behavior and non-verbal communication. In particular, the research concerns the analysis of synchronization of expressive movements and gestures. Synchronization can be computed both on a single user (intra-personal), e.g., to measure the degree of coordination between the joints' velocities of a dancer, and on multiple users (inter-personal), e.g., to detect the level of coordination between multiple users in a group. Through a set of experiments and results, the thesis contributes to the investigation of both intra-personal and inter-personal synchronization applied to support the study of movement expressivity, and improves the state of the art of the available methods by presenting a new algorithm for the analysis of synchronization.
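
    As a rough illustration of intra-personal synchronization, the Python sketch below computes the mean pairwise correlation of joint speeds from motion-capture data. It is a minimal baseline, not the new algorithm proposed in the thesis; the array shapes and the use of Pearson correlation are assumptions for the example.

    ```python
    import numpy as np

    def joint_speeds(positions, dt):
        """Per-joint speed series from a (T, J, 3) array of joint positions."""
        diffs = np.diff(positions, axis=0)             # (T-1, J, 3) frame-to-frame displacement
        return np.linalg.norm(diffs, axis=2) / dt      # (T-1, J) speed of each joint

    def intrapersonal_sync(speeds):
        """Mean pairwise Pearson correlation of joint speed series (1 = fully coordinated)."""
        corr = np.corrcoef(speeds.T)                   # (J, J) matrix, joints as variables
        pairs = corr[np.triu_indices_from(corr, k=1)]  # distinct joint pairs only
        return float(pairs.mean())
    ```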

    Informing bowing and violin learning using movement analysis and machine learning

    Violin performance is characterized by an intimate connection between the player and her instrument that allows continuous control of sound through a sophisticated bowing technique. Great importance in violin pedagogy is therefore given to right-hand technique, which is responsible for most of the sound produced. This study analyses the bowing trajectory in three different classical violin exercises, from audio and motion capture recordings, to classify the different kinds of bowing techniques used by means of machine learning. Our results show that a clustering algorithm is able to appropriately group together the different shapes produced by the bow trajectories.
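
    A minimal sketch of how such a clustering could look: bow trajectories of varying length are resampled to a common length and grouped by shape with k-means. The resampling step, the flattened feature layout, and the choice of k-means (via scikit-learn) are illustrative assumptions, not the study's actual pipeline.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def resample(traj, n=64):
        """Resample a (T, 3) bow trajectory to n points so strokes are comparable."""
        t_old = np.linspace(0.0, 1.0, len(traj))
        t_new = np.linspace(0.0, 1.0, n)
        return np.column_stack(
            [np.interp(t_new, t_old, traj[:, d]) for d in range(traj.shape[1])]
        )

    def cluster_strokes(trajectories, k=3):
        """Group bow strokes by trajectory shape; returns one cluster label per stroke."""
        X = np.stack([resample(t).ravel() for t in trajectories])
        return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    ```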

    A serious games platform for validating sonification of human full-body movement qualities

    In this paper we describe a serious games platform for validating sonification of human full-body movement qualities. This platform supports the design and development of serious games aiming at validating (i) our techniques to measure expressive movement qualities, and (ii) the mapping strategies to translate such qualities into the auditory domain, by means of interactive sonification and active music experience. The platform is part of a more general framework developed in the context of the EU ICT H2020 DANCE "Dancing in the dark" Project n.645553, which aims at making nonverbal artistic whole-body experiences perceivable by visually impaired people.
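
    To make the idea of a sonification mapping concrete, here is a hypothetical example: a normalized movement-quality score is translated into pitch and gain. The parameter ranges and the linear mapping are invented for illustration and are not the project's actual mapping strategies.

    ```python
    def quality_to_sound(quality, base_pitch=220.0, max_pitch=880.0):
        """Map a normalized movement-quality score in [0, 1] to sound parameters.

        Illustrative mapping only: higher quality -> higher pitch and louder output.
        """
        q = min(max(quality, 0.0), 1.0)            # clamp to [0, 1]
        pitch_hz = base_pitch + q * (max_pitch - base_pitch)
        gain_db = -30.0 + q * 30.0                 # -30 dB (barely audible) .. 0 dB
        return {"pitch_hz": pitch_hz, "gain_db": gain_db}
    ```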

    Movement Fluidity Analysis Based on Performance and Perception

    In this work we present a framework and an experimental approach to investigate human body movement qualities (i.e., the expressive components of non-verbal communication) in HCI. We first define a candidate movement quality conceptually, with the involvement of experts in the field (e.g., dancers, choreographers). Next, we collect a dataset of performances and we evaluate the perception of the chosen quality. Finally, we propose a computational model to detect the presence of the quality in a movement segment, and we compare the outcomes of the model with the evaluation results. In this ongoing work, we apply the approach to a specific quality of movement: fluidity. The proposed methods and models may have several applications, e.g., emotion detection from full-body movement, interactive training of motor skills, and rehabilitation.
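
    One common computational proxy for fluidity is jerk-based smoothness: fluid movements exhibit low jerk (the derivative of acceleration). The sketch below illustrates this idea; it is an assumption-laden stand-in, not necessarily the model proposed in the paper.

    ```python
    import numpy as np

    def fluidity_index(position, dt):
        """Jerk-based smoothness of a 1-D position series, a common fluidity proxy.

        Lower mean squared jerk means smoother, more fluid movement; the negative
        log is returned so that larger values indicate more fluidity.
        """
        velocity = np.gradient(position, dt)
        accel = np.gradient(velocity, dt)
        jerk = np.gradient(accel, dt)
        msj = np.mean(jerk ** 2)
        return -np.log(msj + 1e-12)                # epsilon avoids log(0)
    ```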

    Analysis of intrapersonal synchronization in full-body movements displaying different expressive qualities

    Intrapersonal synchronization of limb movements is a relevant feature for assessing coordination of motor behavior. In this paper, we show that it can also distinguish between full-body movements performed with different expressive qualities, namely rigidity, fluidity, and impulsivity. For this purpose, we collected a dataset of movements performed by professional dancers, and annotated the perceived movement qualities with the help of a group of experts in expressive movement analysis. We computed intrapersonal synchronization by applying the Event Synchronization algorithm to the time series of the speed of arms and hands. Results show that movements performed with different qualities display a significantly different amount of intrapersonal synchronization: impulsive movements are the most synchronized, fluid ones show the lowest values of synchronization, and rigid ones lie in between.
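
    For concreteness, a simplified symmetric variant of Event Synchronization (after Quiroga et al.) might look like the sketch below: events are upward threshold crossings of the speed series, and the score is the average fraction of events in each series matched within a tolerance window. Thresholding as the event detector and this particular normalization are assumptions for illustration, not the paper's exact implementation.

    ```python
    import numpy as np

    def detect_events(speed, threshold):
        """Event times: frames where the speed series crosses the threshold upward."""
        above = speed > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1

    def event_synchronization(ev_x, ev_y, tau):
        """Simplified symmetric Event Synchronization: the average fraction of
        events in each series with a counterpart in the other within +/- tau
        frames. Returns a value in [0, 1]."""
        if len(ev_x) == 0 or len(ev_y) == 0:
            return 0.0
        c_xy = sum(np.any(np.abs(ev_y - t) <= tau) for t in ev_x) / len(ev_x)
        c_yx = sum(np.any(np.abs(ev_x - t) <= tau) for t in ev_y) / len(ev_y)
        return 0.5 * (c_xy + c_yx)
    ```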

    Analysis of the qualities of human movement in individual action

    The project was organized as a preliminary study for Use Case #1 of the Horizon 2020 Research Project "Dance in the Dark" (H2020 ICT Project n.645553 - http://dance.dibris.unige.it). The main objective of the DANCE project is to study and develop novel techniques and algorithms for the automated measuring of non-verbal bodily expression and the emotional qualities conveyed by human movement, in order to make nonverbal artistic whole-body experiences perceivable by visually impaired people. In the framework of the eNTERFACE '15 Workshop we investigated methods for analyzing human movements in terms of expressive qualities. When analyzing an individual action we concentrated mainly on the quality of motion and on elements suggesting different emotions. We developed a system to automatically extract several movement features and transfer them to the auditory domain through interactive sonification. We performed an experiment with 26 participants and collected a dataset made of video and audio recordings plus accelerometer data. Finally, we performed a perception study through questionnaires, in order to evaluate and validate the system. As a real-time application of our system we developed a game named "Move in the Dark", which was presented at the Mundaneum Museum in Mons, Belgium, and at the Festival della Scienza, Genova, Italy (27 November 2015).
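
    As an illustration of the kind of feature extraction involved, the sketch below derives two simple descriptors from an accelerometer stream. The specific features (an energy proxy and a peak rate) are common expressive descriptors chosen for the example, not the exact feature set used in the project.

    ```python
    import numpy as np

    def movement_features(accel, fs):
        """Two simple expressive features from a (T, 3) accelerometer stream
        sampled at fs Hz. Illustrative examples only."""
        mag = np.linalg.norm(accel, axis=1)        # acceleration magnitude per sample
        energy = float(np.mean(mag ** 2))          # overall movement intensity
        is_peak = (mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]) & (mag[1:-1] > mag.mean())
        peaks = int(np.sum(is_peak))               # local maxima above the mean level
        return {"energy": energy, "peaks_per_s": peaks * fs / len(mag)}
    ```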

    MIE 2017: 1st international workshop on multimodal interaction for education (workshop summary)

    The International Workshop on Multimodal Interaction for Education aims at investigating how multimodal interactive systems, firmly grounded on psychophysical, psychological, and pedagogical bases, can be designed, developed, and exploited for enhancing teaching and learning processes in different learning environments, with a special focus on children in the classroom. Whilst the usage of multisensory technologies in education is rapidly expanding, the need for solid scientific bases, design guidelines, and appropriate evaluation procedures is emerging. Moreover, the introduction of multimodal interactive systems in the learning environment requires the parallel development of suitable pedagogical paradigms. This workshop aims at bringing together researchers and practitioners from different disciplines, including pedagogy, psychology, psychophysics, and computer science - with a particular focus on human-computer interaction, affective computing, and social signal processing - to discuss such challenges from a multidisciplinary perspective. The workshop is partially supported by the EU-H2020-ICT Project weDRAW (http://www.wedraw.eu).

    Designing multimodal interactive systems using EyesWeb XMI

    This paper introduces the EyesWeb XMI platform (for eXtended Multimodal Interaction) as a tool for fast prototyping of multimodal systems, including interconnection of multiple smart devices, e.g., smartphones. EyesWeb is endowed with a visual programming language enabling users to compose modules into applications. Modules are collected in several libraries and include support for many input devices (e.g., video, audio, motion capture, accelerometers, and physiological sensors), output devices (e.g., video, audio, 2D and 3D graphics), and synchronized multimodal data processing. Specific libraries are devoted to real-time analysis of nonverbal expressive motor and social behavior. The EyesWeb platform encompasses further tools, such as EyesWeb Mobile, supporting the development of customized Graphical User Interfaces for specific classes of users. The paper reviews the EyesWeb platform and its components, starting from its historical origins, with a particular focus on the Human-Computer Interaction aspects.
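
    EyesWeb composes modules into dataflow applications through its visual language; the toy class below sketches that composition pattern conceptually only. It is not EyesWeb's API, and the module names in the usage comment (capture_filter, feature_extractor, sonifier) are hypothetical.

    ```python
    from typing import Callable, List

    class Pipeline:
        """Toy dataflow pipeline illustrating EyesWeb-style module composition."""

        def __init__(self):
            self.modules: List[Callable] = []

        def add(self, module: Callable) -> "Pipeline":
            self.modules.append(module)
            return self                            # allow fluent chaining

        def run(self, frame):
            for module in self.modules:            # each module transforms the frame
                frame = module(frame)
            return frame

    # e.g. Pipeline().add(capture_filter).add(feature_extractor).add(sonifier).run(frame)
    ```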

    Automatic Detection of Reflective Thinking in Mathematical Problem Solving based on Unconstrained Bodily Exploration

    For technology (like serious games) that aims to deliver interactive learning, it is important to address relevant mental experiences such as reflective thinking during problem solving. To facilitate research in this direction, we present the weDraw-1 Movement Dataset of body movement sensor data and reflective thinking labels for 26 children solving mathematical problems in unconstrained settings where the body (full or parts) was required to explore these problems. Further, we provide a qualitative analysis of the behaviours that observers used to identify reflective thinking moments in these sessions. The body movement cues from our compilation informed features that led to an average F1 score of 0.73 for binary classification of problem-solving episodes by reflective thinking, based on Long Short-Term Memory neural networks. We further obtained an average F1 score of 0.79 for end-to-end classification, i.e., based on raw sensor data. Finally, the algorithms reached an average F1 score of 0.64 for subsegments of these episodes as short as 4 seconds. Overall, our results show the possibility of detecting reflective thinking moments from the body movement behaviours of a child exploring mathematical concepts bodily, such as within serious game play.
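
    A minimal PyTorch sketch of the kind of LSTM-based binary classifier described: the hidden size, the single recurrent layer, and the use of the final hidden state are assumptions for illustration, not the paper's reported configuration.

    ```python
    import torch
    import torch.nn as nn

    class ReflectiveThinkingLSTM(nn.Module):
        """Minimal LSTM binary classifier over movement-sensor windows."""

        def __init__(self, n_features: int, hidden: int = 64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                      # x: (batch, time, n_features)
            _, (h_n, _) = self.lstm(x)             # final hidden state summarizes the window
            return self.head(h_n[-1]).squeeze(-1)  # logit for "reflective thinking"

    # Train with nn.BCEWithLogitsLoss on windows labelled by observers.
    ```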